63 research outputs found
A Multichannel Spatial Compressed Sensing Approach for Direction of Arrival Estimation
The final publication is available at http://link.springer.com/chapter/10.1007%2F978-3-642-15995-4_57. Funding: EPSRC Leadership Fellowship EP/G007144/1; EPSRC Platform Grant EP/045235/1; EU FET-Open Project FP7-ICT-225913 "SMALL"
SRA: Fast Removal of General Multipath for ToF Sensors
A major issue with Time of Flight sensors is the presence of multipath
interference. We present Sparse Reflections Analysis (SRA), an algorithm for
removing this interference that has two main advantages. First, it allows for
very general forms of multipath, including interference with three or more
paths, diffuse multipath resulting from Lambertian surfaces, and combinations
thereof. SRA removes this general multipath with robust techniques based on
optimization. Second, due to a novel dimension reduction, we are able to
produce a very fast version of SRA, which is able to run at frame rate.
Experimental results on both synthetic data with ground truth and real images
of challenging scenes validate the approach.
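The abstract does not spell out SRA's optimization step, but its core is sparse recovery: the multipath return is modeled as a sparse vector over candidate path responses. As a rough, generic illustration (the dictionary, sparsity level and parameters below are assumptions, not the paper's), an L1-regularised least-squares solver such as ISTA recovers a few interfering paths:

```python
import numpy as np

def ista(A, y, lam=5.0, steps=300):
    """Iterative shrinkage-thresholding for min 0.5||Ax - y||^2 + lam||x||_1."""
    L = np.linalg.norm(A, 2) ** 2            # Lipschitz constant of the gradient
    x = np.zeros(A.shape[1])
    for _ in range(steps):
        x = x - A.T @ (A @ x - y) / L        # gradient step on the data term
        x = np.sign(x) * np.maximum(np.abs(x) - lam / L, 0.0)  # soft threshold
    return x

# Toy multipath: the measurement mixes a few path responses (columns of A);
# the true backscattering vector is 3-sparse.
rng = np.random.default_rng(0)
A = rng.standard_normal((40, 100))           # illustrative dictionary, not SRA's
x_true = np.zeros(100)
x_true[[5, 37, 80]] = [1.0, -0.5, 2.0]       # three interfering paths
x_hat = ista(A, A @ x_true)
```

The paper's contribution is a dimension reduction that makes a step like this fast enough to run at frame rate; the sketch above omits that entirely.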
Job Scheduling Using successive Linear Programming Approximations of a Sparse Model
EuroPar 2012. In this paper we tackle the well-known problem of scheduling a collection of parallel jobs on a set of processors, either in a cluster or in a multiprocessor computer. For the makespan objective, i.e., the completion time of the last job, this problem has been shown to be NP-hard, and several heuristics have already been proposed to minimize the execution time. We introduce a novel approach based on successive linear programming (LP) approximations of a sparse model. The idea is to relax an integer linear program and use lp norm-based operators to force the solver to find almost-integer solutions that can be assimilated to an integer solution. We consider the case where jobs are either rigid or moldable. A rigid parallel job is performed with a predefined number of processors, while a moldable job can define the number of processors that it is using just before it starts its execution. We compare the scheduling approach with the classic Largest Task First list-based algorithm and show that our approach provides good results for small instances of the problem. The contributions of this paper are both the integration of mathematical methods in the scheduling world and the design of a promising approach which gives good results for scheduling problems with fewer than a hundred processors.
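The LP approach itself needs a solver, but the Largest Task First baseline the authors compare against is easy to sketch. This is a generic list-scheduling reading of LTF for rigid jobs (the job representation and tie-breaking are assumptions, not taken from the paper):

```python
def ltf_makespan(jobs, m):
    """Largest Task First list scheduling for rigid jobs.

    jobs: list of (processors, duration) pairs; m: total processors.
    Jobs are sorted by processor requirement, and each job starts at the
    earliest time when enough processors are free.
    """
    jobs = sorted(jobs, key=lambda j: j[0], reverse=True)
    running = []                 # (end_time, processors) of active jobs
    t, free, makespan = 0.0, m, 0.0
    for procs, dur in jobs:
        while free < procs:      # advance time to the next job completion
            running.sort()
            end, p = running.pop(0)
            t, free = end, free + p
        running.append((t + dur, procs))
        free -= procs
        makespan = max(makespan, t + dur)
    return makespan
```

For example, on a 4-processor machine with jobs (2 procs, 3 s), (2 procs, 2 s) and (1 proc, 4 s), the two 2-processor jobs start immediately and the 1-processor job waits until one finishes, giving a makespan of 6 s.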
Random forests with random projections of the output space for high dimensional multi-label classification
We adapt the idea of random projections applied to the output space, so as to
enhance tree-based ensemble methods in the context of multi-label
classification. We show how learning time complexity can be reduced without
affecting computational complexity and accuracy of predictions. We also show
that random output space projections may be used in order to reach different
bias-variance tradeoffs, over a broad panel of benchmark problems, and that
this may lead to improved accuracy while significantly reducing the
computational burden of the learning stage.
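The key property behind projecting the output space is that a Gaussian random projection approximately preserves distances between label vectors (Johnson-Lindenstrauss), so an ensemble trained on the much smaller projected outputs can stand in for one trained on the full label matrix. A minimal sketch of that property, with illustrative sizes (the paper's encoding/decoding pipeline is not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(1)
n, d, m = 50, 200, 60                         # samples, labels, projected size
Y = (rng.random((n, d)) < 0.02).astype(float) # sparse binary label matrix
G = rng.standard_normal((d, m)) / np.sqrt(m)  # Gaussian random projection
Z = Y @ G                                     # compressed outputs to learn on

# Pairwise distances between label vectors are roughly preserved under the
# projection, which is what makes training on Z instead of Y viable.
D_orig = np.linalg.norm(Y[:, None] - Y[None, :], axis=2)
D_proj = np.linalg.norm(Z[:, None] - Z[None, :], axis=2)
ratio = D_proj[D_orig > 0] / D_orig[D_orig > 0]
```

Training trees on the 60 projected targets instead of 200 labels is where the learning-time reduction comes from.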
Sparsity without the Complexity: Loss Localisation using Tree Measurements
We study network loss tomography based on observing average loss rates over a
set of paths forming a tree -- a severely underdetermined linear problem for
the unknown link loss probabilities. We examine in detail the role of sparsity
as a regularising principle, pointing out that the problem is technically
distinct from others in the compressed sensing literature. While sparsity has
been applied in the context of tomography, key questions regarding uniqueness
and recovery remain unanswered. Our work exploits the tree structure of path
measurements to derive sufficient conditions for sparse solutions to be unique
and the condition under which L1 minimization recovers the true underlying
solution. We present a fast single-pass linear algorithm for L1
minimization and prove that a minimum-L1 solution is both unique and the
sparsest for tree topologies. By considering the placement of lossy links
within trees, we show that sparse solutions remain unique more often than is
commonly supposed. We prove similar results for a noisy version of the problem.
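The "severely underdetermined" structure is easy to see on a toy tree: with log-loss additivity, each measured path loss is the sum of the losses of the links it traverses, and there are more links than paths. A minimal sketch (the tree, link numbering and loss value are illustrative, not from the paper):

```python
import numpy as np

# Two-level binary tree: links 0-1 are internal, 2-5 are leaf links; each
# row of A lists the links on one root-to-leaf measurement path.
paths = [[0, 2], [0, 3], [1, 4], [1, 5]]
A = np.zeros((4, 6))
for i, p in enumerate(paths):
    A[i, p] = 1.0

x = np.zeros(6)
x[1] = 0.3                        # a single lossy internal link (log-loss units)
y = A @ x                         # observed per-path losses

rank = np.linalg.matrix_rank(A)   # 4 paths vs 6 unknown links: underdetermined
```

Here only the two paths through link 1 show loss, and the 1-sparse assignment to link 1 explains the data exactly; the paper's results concern when such sparse explanations are unique and recoverable by L1 minimization.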
Efficient MR Image Reconstruction for Compressed MR Imaging
In this paper, we propose an efficient algorithm for MR image reconstruction. The algorithm minimizes a linear combination of three terms corresponding to least-squares data fitting, total variation (TV) and L1 norm regularization, a combination that has been shown to be very powerful for MR image reconstruction. First, we decompose the original problem into L1 and TV norm regularization subproblems. Then, these two subproblems are efficiently solved by existing techniques. Finally, the reconstructed image is obtained from the weighted average of the solutions of the two subproblems in an iterative framework. We compare the proposed algorithm with previous methods in terms of reconstruction accuracy and computational complexity. Numerous experiments demonstrate the superior performance of the proposed algorithm for compressed MR image reconstruction.
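The decompose-solve-average scheme can be sketched abstractly: each subproblem amounts to a proximal operator, and one iteration averages the two proximal solutions after a gradient step on the data term. This toy 1-D sketch uses an identity sensing operator and a crude TV solver purely for illustration (the paper's setting is undersampled Fourier data with wavelet-domain L1):

```python
import numpy as np

def prox_l1(x, t):
    # proximal operator of t*||x||_1: soft-thresholding
    return np.sign(x) * np.maximum(np.abs(x) - t, 0.0)

def prox_tv_1d(x, t, iters=50):
    # crude 1-D TV denoiser via subgradient descent; a stand-in for the TV
    # subproblem solver (practical codes use e.g. Chambolle's algorithm)
    z = x.astype(float).copy()
    for _ in range(iters):
        g = np.sign(np.diff(z))
        grad = np.zeros_like(z)
        grad[1:] += g
        grad[:-1] -= g
        z -= 0.1 * ((z - x) + t * grad)
    return z

def splitting_step(x, y, step, a_tv, a_l1):
    # one iteration: gradient step on the data term (identity sensing, a toy),
    # then average the solutions of the TV and L1 subproblems
    u = x - step * (x - y)
    return 0.5 * (prox_tv_1d(u, a_tv) + prox_l1(u, a_l1))
```

Iterating `splitting_step` is the skeleton of the weighted-average framework; the weights, step sizes and subproblem solvers here are placeholders.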
Blind Deconvolution via Lower-Bounded Logarithmic Image Priors
In this work we devise two novel algorithms for blind deconvolution based on a family of logarithmic image priors. In contrast to recent approaches, we consider a minimalistic formulation of the blind deconvolution problem where there are only two energy terms: a least-squares term for the data fidelity and an image prior based on a lower-bounded logarithm of the norm of the image gradients. We show that this energy formulation is sufficient to achieve the state of the art in blind deconvolution with a good margin over previous methods. Much of the performance is due to the chosen prior. On the one hand, this prior is very effective in favoring sparsity of the image gradients. On the other hand, this prior is non-convex. Therefore, strategies that can deal effectively with local minima of the energy become necessary. We devise two iterative minimization algorithms that at each iteration solve convex problems: one obtained via the primal-dual approach and one via majorization-minimization. While the former is computationally efficient, the latter achieves state-of-the-art performance on a public dataset.
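Why a lower-bounded logarithmic prior favors sparse gradients can be seen numerically: concentrating gradient energy in one strong edge scores far lower (better) than spreading the same energy over many small gradients. A minimal sketch, with an assumed form log(g^2 + eps) for the bounded log prior (the paper's exact parameterisation may differ):

```python
import numpy as np

def log_prior(grad, eps=1e-2):
    # lower-bounded logarithmic prior on image gradients: each term is
    # log(g^2 + eps), so the total is bounded below by n*log(eps)
    return np.sum(np.log(grad ** 2 + eps))

sparse_grads = np.array([2.0, 0.0, 0.0, 0.0])  # one strong edge
dense_grads = np.array([1.0, 1.0, 1.0, 1.0])   # same l2 energy, spread out
```

Because the log saturates, the prior rewards exact zeros heavily, which is also why it is non-convex and needs the local-minima-aware solvers the paper devises.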
From Social Data Mining to Forecasting Socio-Economic Crisis
Socio-economic data mining has a great potential in terms of gaining a better
understanding of problems that our economy and society are facing, such as
financial instability, shortages of resources, or conflicts. Without
large-scale data mining, progress in these areas seems hard or impossible.
Therefore, a suitable, distributed data mining infrastructure and research
centers should be built in Europe. It also appears appropriate to build a
network of Crisis Observatories. They can be imagined as laboratories devoted
to the gathering and processing of enormous volumes of data on both natural
systems such as the Earth and its ecosystem, as well as on human
techno-socio-economic systems, so as to gain early warnings of impending
events. Reality mining provides the chance to adapt more quickly and more
accurately to changing situations. Further opportunities arise by individually
customized services, which however should be provided in a privacy-respecting
way. This requires the development of novel ICT (such as a self-organizing
Web), but most likely new legal regulations and suitable institutions as well.
As long as such regulations are lacking on a world-wide scale, it is in the
public interest that scientists explore what can be done with the huge data
available. Big data do have the potential to change or even threaten democratic
societies. The same applies to sudden and large-scale failures of ICT systems.
Therefore, dealing with data must be done with a large degree of responsibility
and care. Self-interests of individuals, companies or institutions have limits,
where the public interest is affected, and public interest is not a sufficient
justification to violate human rights of individuals. Privacy is a high good,
as confidentiality is, and damaging it would have serious side effects for
society. Comment: 65 pages, 1 figure, Visioneer White Paper, see
http://www.visioneer.ethz.c
A Bayesian non-parametric clustering approach for semi-supervised Structural Health Monitoring
A key challenge in Structural Health Monitoring (SHM) is the lack of availability of data from a full range of changing operational and damage conditions with which to train an identification/classification algorithm. This paper presents a framework based on Bayesian non-parametric clustering, in particular Dirichlet Process (DP) mixture models, for performing SHM tasks in a semi-supervised manner, including an online feature extraction method. Previously, methods applied for SHM of structures in operation, such as bridges, have required at least a year's worth of data before any inferences on performance or structural condition can be made. The method introduced here avoids the need for training data to be collected before inference can begin and increases in robustness as more data are added online. The method is demonstrated on two datasets: one from a laboratory test, the other from a full-scale test on civil infrastructure. Results show very good classification accuracy and the ability to incorporate information online (e.g. regarding environmental changes).
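What makes a DP mixture suitable for this setting is its clustering prior, the Chinese Restaurant Process: the number of clusters is not fixed in advance and can grow as data arrive, so a new operational or damage state can open a new cluster online. A minimal sketch of a CRP draw (a prior sample only, not the paper's inference procedure; alpha and sizes are illustrative):

```python
import numpy as np

def crp_partition(n, alpha, rng):
    """Draw a partition from the Chinese Restaurant Process: each point joins
    an existing cluster with probability proportional to its size, or opens a
    new cluster with weight alpha."""
    counts, labels = [], []
    for _ in range(n):
        probs = np.array(counts + [alpha], dtype=float)
        probs /= probs.sum()
        k = rng.choice(len(probs), p=probs)
        if k == len(counts):
            counts.append(1)          # open a new cluster
        else:
            counts[k] += 1
        labels.append(k)
    return labels, counts

labels, counts = crp_partition(500, alpha=2.0, rng=np.random.default_rng(0))
```

The expected number of clusters grows only logarithmically with n, which keeps the model parsimonious while still allowing unseen conditions to be absorbed as they appear.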